4 research outputs found

    Automated analysis of inter-parameter dependencies in web APIs

    Get PDF
    Web services often impose constraints that restrict the way in which two or more input parameters can be combined to form valid calls to the service, i.e., inter-parameter dependencies. Current web API specification languages like the OpenAPI Specification (OAS) provide no support for the formal description of such dependencies, making it hardly possible to interact with the services without human intervention. We propose specifying and automatically analyzing inter-parameter dependencies in web APIs. To this end, we propose a domain-specific language to describe these dependencies, a constraint programming-aided tool supporting their automated analysis, and an OAS extension integrating our approach and easing its adoption. Together, these contributions open a new range of possibilities in areas such as source code generation and testing.
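    To illustrate the kind of rule the abstract refers to, here is a minimal sketch in Python of checking an "Or"-style inter-parameter dependency (at least one of two parameters must be present) before a call is issued. The parameter names and the rule are illustrative assumptions, not the authors' actual domain-specific language or tooling.

```python
# Hypothetical sketch: validating an inter-parameter dependency of the form
# Or(lat, zip_code) -- at least one of the two parameters must be supplied --
# before calling a web API. Parameter names and the rule are illustrative only.

def satisfies_or_dependency(params: dict, a: str, b: str) -> bool:
    """Return True if at least one of the two dependent parameters is present."""
    return a in params or b in params

request_params = {"zip_code": "41012", "radius": 10}

if satisfies_or_dependency(request_params, "lat", "zip_code"):
    print("Valid call: dependency Or(lat, zip_code) holds")
else:
    print("Invalid call: neither 'lat' nor 'zip_code' was provided")
```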

    AI-driven web API testing

    Get PDF
    Testing of web APIs is nowadays more critical than ever before, as they are the current standard for software integration. A bug in an organization's web API could have a huge impact both internally (services relying on that API) and externally (third-party applications and end users). Most existing tools and testing approaches require writing tests or instrumenting the system under test (SUT). The main aim of this dissertation is to take web API testing to an unprecedented level of automation and thoroughness. To this end, we plan to apply artificial intelligence (AI) techniques for the autonomous detection of software failures. Specifically, the idea is to develop intelligent programs (we call them "bots") capable of generating hundreds, thousands, or even millions of test inputs and of evaluating whether the test outputs are correct based on: 1) patterns learned from previous executions of the SUT; and 2) knowledge gained from analyzing thousands of similar programs. Evaluation results of our initial prototype are promising, with bugs being automatically detected in some real-world APIs.
    Ministerio de Economía y Competitividad, BELI (TIN2015-70560-R); Ministerio de Ciencia, Innovación y Universidades, RTI2018-101204-B-C21 (HORATIO); Ministerio de Educación, Cultura y Deporte, FPU17/0407
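    The following Python sketch illustrates the general idea of such a "bot": generate many inputs for a system under test and flag outputs that deviate from the behavior seen in previous executions. The fake SUT, the input generator, and the oracle are all hypothetical assumptions for illustration, not the dissertation's actual prototype.

```python
import random
from collections import Counter

# Hypothetical sketch of a test bot: it generates many inputs for a fake
# system under test (SUT) and flags outputs whose status codes deviate from
# those observed in earlier executions. All names and logic are illustrative.

def fake_sut(value: int) -> int:
    """Stand-in for a web API call; returns an HTTP-like status code."""
    return 500 if value < 0 else 200  # injected bug: negative inputs fail

def learn_expected_codes(n_runs: int = 100) -> set:
    """'Training' phase: record status codes seen for benign inputs."""
    return {fake_sut(random.randint(0, 1000)) for _ in range(n_runs)}

def run_bot(n_tests: int = 1000) -> Counter:
    expected = learn_expected_codes()
    verdicts = Counter()
    for _ in range(n_tests):
        code = fake_sut(random.randint(-1000, 1000))
        verdicts["suspected failure" if code not in expected else "pass"] += 1
    return verdicts

print(run_bot())
```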

    Changing the unchoking policy for an enhanced bittorrent

    No full text
    In this paper, we propose a novel optimistic unchoking approach for the BitTorrent protocol whose key objective is to improve the quality of inter-connections amongst peers. In turn, this yields enhanced data distribution without penalizing underutilized and/or idle peers. The suggested policy takes into consideration the number of peers currently interested in downloading from a client that is to be unchoked. Our conjecture is that clients having few peers interested in downloading data from them should be favored with optimistic unchoke intervals. This enables the clients in question to receive data, since they become unchoked sooner, and consequently to trigger the interest of additional peers. In contrast, clients with plenty of "interested" peers should enjoy a lower priority to be selected as "planned optimistic unchoked", as they likely have enough data to forward and have saturated their uplinks. In this context, we increase the aggregate probability that the swarm obtains a higher number of interested-in-cooperation and directly-connected peers, leading to improved peer inter-connection. Experimental results indicate that our approach significantly outperforms the existing optimistic unchoking policy. © 2012 Springer-Verlag
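    A minimal Python sketch of the selection idea described above: peers with fewer interested neighbours get a higher chance of being planned as the next optimistic unchoke. The peer data and the inverse weighting are illustrative assumptions, not the paper's exact algorithm or parameters.

```python
import random

# Hypothetical sketch of the proposed optimistic-unchoke selection: peers with
# fewer "interested" neighbours get a higher chance of being planned as the
# next optimistic unchoke. Data and weighting scheme are illustrative only.

# peer id -> number of peers currently interested in downloading from it
interested_counts = {"peer_a": 0, "peer_b": 7, "peer_c": 2, "peer_d": 15}

def pick_optimistic_unchoke(counts: dict) -> str:
    """Weight each candidate inversely to its number of interested peers."""
    weights = [1.0 / (1 + n) for n in counts.values()]
    return random.choices(list(counts), weights=weights, k=1)[0]

print(pick_optimistic_unchoke(interested_counts))
```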

    EnhancedBit: Unleashing the potential of the unchoking policy in the BitTorrent protocol

    No full text
    In this paper, we propose a modification to the BitTorrent protocol related to its peer unchoking policy. In particular, we apply a novel optimistic unchoking approach that improves the quality of inter-connections amongst peers, i.e., increases the number of directly-connected and interested-in-cooperation peers without penalizing underutilized and/or idle peers. Our optimistic unchoking policy takes into consideration the number of clients currently interested in downloading from a peer that is to be unchoked. Our conjecture is that peers having few clients interested in downloading data from them should be favored with optimistic unchoke intervals. This enables the peers in question to receive data, since they become unchoked sooner, and in turn to trigger the interest of additional clients. In contrast, peers with plenty of "interested" clients should enjoy a lower priority to be selected as planned optimistic unchoked, since these peers likely have enough data to forward; nevertheless, they receive enough data due to tit-for-tat peer reciprocation and are not in need of optimistic unchoking slots. Armed with this realization, we establish an analytical model and prove a significant performance improvement under our modified BitTorrent protocol. Experimental results also indicate that our approach significantly outperforms the existing optimistic unchoking policy in three important aspects: First, there is a higher number of interested-in-cooperation and directly-connected peers. Second, since leechers now act as data intermediaries, the load on seeders eases up considerably. Last, a shorter bootstrapping period for fresh peers is achieved. Hence, we claim that our approach helps implement an enhanced BitTorrent protocol, and we name it "EnhancedBit".
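    To make the interplay with tit-for-tat concrete, here is a sketch of one unchoke round under the policy described above: regular slots go to the fastest uploaders, while the optimistic slot prefers the peer with the fewest interested clients. Peer data, slot counts, and tie-breaking are hypothetical assumptions, not the protocol's actual values.

```python
# Hypothetical sketch of one unchoke round: regular slots reciprocate the
# fastest uploaders (tit-for-tat), while the optimistic slot favours the peer
# with the fewest interested clients. All values below are illustrative only.

peers = {
    # peer id: (upload rate to us in KiB/s, number of clients interested in it)
    "p1": (120, 9),
    "p2": (80, 1),
    "p3": (45, 0),
    "p4": (200, 14),
    "p5": (10, 3),
}

REGULAR_SLOTS = 3

def unchoke_round(stats: dict) -> list:
    # Regular unchokes: reciprocate with the fastest uploaders.
    regular = sorted(stats, key=lambda p: stats[p][0], reverse=True)[:REGULAR_SLOTS]
    # Optimistic unchoke: among the rest, favour the peer with the fewest
    # interested clients, so it can bootstrap and attract more interest.
    rest = [p for p in stats if p not in regular]
    optimistic = min(rest, key=lambda p: stats[p][1])
    return regular + [optimistic]

print(unchoke_round(peers))  # e.g. ['p4', 'p1', 'p2', 'p3']
```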